
    Binary codes with disjoint codebooks and mutual Hamming distance

    Equal-length linear binary block error-control codes with disjoint codebooks and mutual Hamming distance are considered. A method of constructing pairs of these disjoint codes from known cyclic codes, and determining their mutual distance, is described. Some sets of length-15 cyclic codes are tabulated.
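
    A minimal sketch of the quantity involved, assuming nothing beyond its definition: the mutual Hamming distance of two disjoint codebooks is the minimum Hamming distance taken over all cross pairs of codewords. The toy codebooks below are illustrative and are not the length-15 cyclic codes tabulated in the paper.

    # Hypothetical sketch: computing the mutual Hamming distance between two
    # small binary codebooks by exhaustive pairwise comparison.  The example
    # codebooks are illustrative only, not the paper's constructions.
    from itertools import product

    def hamming(a, b):
        """Hamming distance between two equal-length binary tuples."""
        return sum(x != y for x, y in zip(a, b))

    def mutual_distance(code_a, code_b):
        """Minimum Hamming distance over all cross pairs (a in A, b in B)."""
        return min(hamming(a, b) for a, b in product(code_a, code_b))

    # Two toy length-4 codebooks with disjoint codeword sets.
    code_a = [(0, 0, 0, 0), (1, 1, 1, 1)]
    code_b = [(1, 1, 0, 0), (0, 0, 1, 1)]

    print(mutual_distance(code_a, code_b))  # prints 2 for these toy codes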

    Time to publication for NIHR HTA programme-funded research: a cohort study

    Objective: To assess the time to publication of primary research and evidence syntheses funded by the National Institute for Health Research (NIHR) Health Technology Assessment (HTA) Programme, published as a monograph in Health Technology Assessment and as a journal article in the wider biomedical literature. Study design: Retrospective cohort study. Setting: Primary research and evidence synthesis projects funded by the HTA Programme were included in the cohort if they were registered in the NIHR research programmes database and the draft final report was planned for submission for publication in Health Technology Assessment on or before 9 December 2011. Main outcome measures: The median time to publication and publication at 30 months in Health Technology Assessment and in an external journal were determined by searching the NIHR research programmes database and the HTA Programme website. Results: Of 458 included projects, 184 (40.2%) were primary research projects and 274 (59.8%) were evidence syntheses. A total of 155 primary research projects had a completion date; the median time to publication was 23 months (26.5 and 35.5 months to publish a monograph and to publish in an external journal, respectively) and 69% were published within 30 months. The median time to publication of HTA-funded trials (n=126) was 24 months and 67.5% were published within 30 months. Among the evidence syntheses with a protocol online date (n=223), the median time to publication was 25.5 months (28 months to publication as a monograph), but only 44.4% of evidence synthesis projects were published in an external journal; 65% of evidence synthesis studies had been published within 30 months. Conclusions: Research funded by the HTA Programme is published promptly. The importance of Health Technology Assessment was highlighted, as the median time to publication was 9 months shorter for a monograph than for an external journal article.

    Lifetime analyses of error-control coded semiconductor RAM systems

    The paper is concerned with developing quantitative results on the lifetime of coded random-access semiconductor memory systems. Although individual RAM chips are highly reliable, when large numbers of chips are combined to form a large memory system, the reliability may not be sufficiently high for the given application. In this case, error-correction coding is used to improve the reliability and hence the lifetime of the system. Formulas are developed which will enable the system designer to calculate the improvement in lifetime (over an uncoded system) for any particular coding scheme and size of memory. This will enable the designer to see if a particular memory system gives the required reliability, in terms of hours of lifetime, for the particular application. In addition, the designer will be able to calculate the percentage of identical systems that will, on average, last a given length of time.
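
    A hedged sketch of the kind of calculation such formulas support, under a simplified hypothetical model (not the paper's formulas): each codeword spans n chips, the code corrects up to t failed chips per word, and chip lifetimes are exponential with a constant failure rate. All parameter values are illustrative assumptions.

    # Illustrative model: reliability of a coded memory in which each codeword
    # spans n chips and up to t_correct chip failures per word are correctable.
    # Chip lifetimes are assumed exponential with failure rate lam (per hour).
    from math import comb, exp

    def word_reliability(t_hours, n, t_correct, lam):
        """Probability a single n-chip word is still correctable at time t_hours."""
        p_chip = exp(-lam * t_hours)      # chip survival probability
        q = 1.0 - p_chip                  # chip failure probability
        # The word survives if at most t_correct of its n chips have failed.
        return sum(comb(n, k) * q**k * p_chip**(n - k) for k in range(t_correct + 1))

    def system_reliability(t_hours, n_words, n, t_correct, lam):
        """Probability that every word in the memory is still correctable."""
        return word_reliability(t_hours, n, t_correct, lam) ** n_words

    # Hypothetical example: 4096 words of 39 chips each (e.g. a (39,32) SEC-DED
    # code correcting one chip failure per word), lam = 1e-6 failures/hour.
    print(system_reliability(10_000, 4096, 39, 1, 1e-6))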

    Soft-decision minimum-distance sequential decoding algorithm for convolutional codes

    The maximum-likelihood decoding of convolutional codes has generally been considered impractical for other than relatively short constraint length codes, because of the exponential growth in complexity with increasing constraint length. The soft-decision minimum-distance decoding algorithm proposed in the paper approaches the performance of a maximum-likelihood decoder, and uses a sequential decoding approach to avoid an exponential growth in complexity. The algorithm also utilises the distance and structural properties of convolutional codes to considerably reduce the amount of searching needed to find the minimum soft-decision distance paths when a back-up search is required. This is done in two main ways. Firstly, a small set of paths called permissible paths are utilised to search the whole of the subtree for the better path, instead of using all the paths within a given subtree. Secondly, the decoder identifies which subset of permissible paths should be utilised in a given search and which may be ignored. In this way many unnecessary path searches are completely eliminated. Because the decoding effort required by the algorithm is low, and the decoding processes are simple, the algorithm opens the possibility of building high-speed long constraint length convolutional decoders whose performance approaches that of the optimum maximum-likelihood decoder. The paper describes the algorithm and its theoretical basis, and gives examples of its operation. Also, results obtained from practical implementations of the algorithm using a high-speed microcomputer are presented.
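
    For contrast, the sketch below shows the brute-force alternative that such an algorithm is designed to avoid: exhaustively minimising a soft-decision path metric over every message of a short rate-1/2, constraint length 3 code. It only illustrates the metric being minimised; it is not the proposed sequential algorithm, and the generator taps, symbol mapping and received values are illustrative assumptions.

    # Brute-force minimum soft-decision distance decoding of the (7,5) octal,
    # rate-1/2, constraint length 3 convolutional code (illustration only).
    from itertools import product

    G = [(1, 1, 1), (1, 0, 1)]   # generator taps for the (7,5) code

    def encode(bits):
        """Encode a bit sequence with the (7,5) code, starting from the zero state."""
        state = [0, 0]
        out = []
        for b in bits:
            reg = [b] + state
            out += [sum(g * r for g, r in zip(gen, reg)) % 2 for gen in G]
            state = [b, state[0]]
        return out

    def soft_distance(code_bits, received):
        """Soft-decision distance: sum of |r - s| with coded 0 -> +1 and 1 -> -1."""
        return sum(abs(r - (1 - 2 * c)) for c, r in zip(code_bits, received))

    def decode(received, n_bits):
        """Return the message whose codeword is closest to the soft received values."""
        return min(product([0, 1], repeat=n_bits),
                   key=lambda msg: soft_distance(encode(msg), received))

    # Noisy soft values for the all-zero message of length 4 (8 coded symbols).
    received = [0.9, 1.1, 0.7, -0.2, 1.0, 0.8, 1.2, 0.6]
    print(decode(received, 4))   # recovers (0, 0, 0, 0) for these values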

    Data transmission with variable-redundancy error control over a high-frequency channel

    Results of computations and field tests on a binary-data-transmission system, operating at 1 kbaud over an h.f. channel, are presented. Error correction is effected by means of error detection and automatic request for repeat, via a feedback channel (a Post Office private line). A set of short, fixed-block-length cyclic codes is available, a code of appropriate redundancy being automatically selected to match the varying channel conditions. The decision about which code to use is made at the receiver, and the transmitter is informed via the feedback channel. The results show that relatively simple, reliable, and efficient data communication can be realised by this means.
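
    A hypothetical sketch of the variable-redundancy idea: a more redundant code is selected as the estimated channel error rate rises, and blocks that fail error detection are repeated. The code set, thresholds and random-error channel model below are illustrative assumptions, not the parameters of the field tests.

    # Toy simulation of code selection plus stop-and-wait ARQ; the acceptance
    # test idealises the codes by assuming every error pattern is detected.
    import random

    CODE_SET = [(15, 11), (15, 7), (15, 5)]   # (n, k) cyclic codes, low to high redundancy

    def select_code(bit_error_rate):
        """Pick more redundancy as the estimated channel error rate rises."""
        if bit_error_rate < 0.005:
            return CODE_SET[0]
        if bit_error_rate < 0.02:
            return CODE_SET[1]
        return CODE_SET[2]

    def transmit(n_blocks, bit_error_rate):
        """Count transmissions needed when errored blocks are detected and repeated."""
        n, k = select_code(bit_error_rate)
        transmissions = 0
        for _ in range(n_blocks):
            while True:
                transmissions += 1
                # Accept the block only if none of its n bits were hit by an error.
                if all(random.random() > bit_error_rate for _ in range(n)):
                    break
        return (n, k), transmissions

    random.seed(1)
    # At a bit error rate of 0.01 and n = 15, about 1.16 transmissions per
    # block are expected on average (1 / 0.99**15).
    print(transmit(1000, 0.01))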

    An efficient minimum-distance decoding algorithm for convolutional error-correcting codes

    Minimum-distance decoding of convolutional codes has generally been considered impractical for other than relatively short constraint length codes, because of the exponential growth in complexity with increasing constraint length. The minimum-distance decoding algorithm proposed in the paper, however, uses a sequential decoding approach to avoid an exponential growth in complexity with increasing constraint length, and also utilises the distance and structural properties of convolutional codes to considerably reduce the amount of tree searching needed to find the minimum-distance path. In this way the algorithm achieves a complexity that does not grow exponentially with increasing constraint length, and is efficient for both long and short constraint length codes. The algorithm consists of two main processes. Firstly, a direct-mapping scheme, which automatically finds the minimum-distance path in a single mapping operation, is used to eliminate the need for all short back-up tree searches. Secondly, when a longer back-up search is required, an efficient tree-searching scheme is used to minimise the required search effort. The paper describes the complete algorithm and its theoretical basis, and examples of its operation are given.
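
    An illustrative sketch, not the paper's direct-mapping or tree-searching schemes: a plain depth-first search of the code tree that abandons any branch whose partial Hamming distance already exceeds the best complete path found so far. The (7,5) code and the received sequence are assumptions chosen for the example.

    # Depth-first code-tree search with simple distance-based pruning for the
    # (7,5) rate-1/2, constraint length 3 convolutional code (hard decisions).
    G = [(1, 1, 1), (1, 0, 1)]   # generator taps

    def branch_output(bit, state):
        """Coded symbols produced when `bit` enters the encoder in `state`."""
        reg = (bit,) + state
        return [sum(g * r for g, r in zip(gen, reg)) % 2 for gen in G]

    def min_distance_decode(received, n_bits):
        """Return (best_distance, best_message) over the depth-n_bits code tree."""
        best = [float("inf"), None]

        def search(depth, state, dist, path):
            if dist >= best[0]:
                return                  # prune: this branch cannot beat the best path
            if depth == n_bits:
                best[0], best[1] = dist, tuple(path)
                return
            for bit in (0, 1):
                out = branch_output(bit, state)
                r = received[2 * depth: 2 * depth + 2]
                d = sum(a != b for a, b in zip(out, r))
                search(depth + 1, (bit, state[0]), dist + d, path + [bit])

        search(0, (0, 0), 0, [])
        return best[0], best[1]

    # Received hard-decision bits: the all-zero codeword with two bit errors.
    received = [0, 1, 0, 0, 1, 0, 0, 0]
    print(min_distance_decode(received, 4))    # prints (2, (0, 0, 0, 0))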

    New trapdoor-knapsack public-key cryptosystem

    The paper presents a new trapdoor-knapsack public-key cryptosystem. The encryption equation is based on the general modular knapsack equation, but, unlike the Merkle-Hellman scheme, the knapsack components do not have to have a superincreasing structure. The trapdoor is based on transformations between the modular and radix form of the knapsack components, via the Chinese remainder theorem. The security is based on factoring a number composed of 256-bit prime factors. The resulting cryptosystem has high density, approximately 30% message expansion and a public key of 14 Kbits. This compares very favourably with the Merkle-Hellman scheme, which has over 100% expansion and a public key of 80 Kbits. The major advantage of the scheme when compared with the RSA scheme is one of speed. Typically, knapsack schemes such as the one proposed here are capable of throughput speeds which are orders of magnitude faster than the RSA scheme.
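
    A toy sketch of two ingredients named in the abstract, assuming only their textbook forms: the general modular knapsack encryption sum, and Chinese-remainder-theorem reconstruction of a number from its residues. It is not a reconstruction of the proposed cryptosystem, and the numbers are small illustrative values with no cryptographic strength.

    # (a) modular knapsack encryption sum; (b) CRT reconstruction from residues.
    from functools import reduce

    def knapsack_encrypt(message_bits, public_key, modulus):
        """Ciphertext c = sum(m_i * a_i) mod M for the public knapsack components a_i."""
        return sum(m * a for m, a in zip(message_bits, public_key)) % modulus

    def crt(residues, moduli):
        """Chinese remainder theorem: the x with x = r_i (mod n_i) for coprime n_i."""
        M = reduce(lambda x, y: x * y, moduli)
        x = 0
        for r, n in zip(residues, moduli):
            Mi = M // n
            x += r * Mi * pow(Mi, -1, n)    # pow(.., -1, n) is the modular inverse
        return x % M

    public_key = [467, 355, 131, 318]       # toy knapsack components
    print(knapsack_encrypt([1, 0, 1, 1], public_key, 1009))   # 467 + 131 + 318 = 916
    print(crt([2, 3], [5, 7]))              # 17, since 17 % 5 == 2 and 17 % 7 == 3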

    Asymptotics of Quantum Relative Entropy From Representation Theoretical Viewpoint

    In this paper it is proved that the quantum relative entropy $D(\sigma \| \rho)$ can be asymptotically attained by Kullback-Leibler divergences of probabilities given by a certain sequence of POVMs. The sequence of POVMs depends on $\rho$, but is independent of the choice of $\sigma$. Comment: LaTeX2e, 8 pages. The title was changed from "Asymptotic Attainment for Quantum Relative Entropy".
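
    For reference, a LaTeX restatement of the standard definitions behind the abstract; this is the textbook form of the statement, not the paper's precise theorem.

    \[
      D(\sigma \| \rho) = \operatorname{Tr}\,\sigma\,(\log\sigma - \log\rho),
      \qquad
      D\!\left(P^{M}_{\sigma} \,\middle\|\, P^{M}_{\rho}\right)
        = \sum_{i} \operatorname{Tr}(\sigma M_i)\,
          \log\frac{\operatorname{Tr}(\sigma M_i)}{\operatorname{Tr}(\rho M_i)},
    \]
    where $M = \{M_i\}$ is a POVM. The asymptotic attainment asserted in the abstract then reads: there is a sequence of POVMs $M^{(n)}$ on the $n$-fold system, depending on $\rho$ but not on $\sigma$, such that
    \[
      \lim_{n\to\infty} \frac{1}{n}\,
      D\!\left(P^{M^{(n)}}_{\sigma^{\otimes n}} \,\middle\|\, P^{M^{(n)}}_{\rho^{\otimes n}}\right)
      = D(\sigma \| \rho).
    \]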